A normalized gradient algorithm for an adaptive recurrent perceptron

Authors

  • Jonathon A. Chambers
  • Warren Sherliker
  • Danilo P. Mandic
Abstract

A normalized algorithm for the on-line adaptation of a recurrent perceptron is derived. The algorithm builds upon the normalized backpropagation (NBP) algorithm for feedforward neural networks and provides an adaptive learning rate for a recurrent perceptron learning algorithm. It is based upon a local linearization about the current point in the state space of the network. The learning rate is normalized by the squared norm of the gradient at the neuron, which extends the notion of normalized linear algorithms to the nonlinear case.
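To make the normalization concrete, here is a minimal numerical sketch, not the authors' exact derivation: a single tanh neuron with one output-feedback tap, adapted with a step size divided by the squared norm of the instantaneous gradient at the neuron. Treating the feedback input as a constant during each update mirrors the local linearization mentioned above; the teacher signal, weights, and constants below are invented for illustration.

```python
import numpy as np

def normalized_recurrent_perceptron(x, d, eta=0.5, eps=1e-6):
    """Adapt w for y(k) = tanh(w . [x(k), d(k-1)]) with a step size
    normalized by the squared gradient norm (illustrative sketch)."""
    w = np.zeros(2)
    errors = []
    d_prev = 0.0
    for k in range(len(x)):
        u = np.array([x[k], d_prev])    # external input + delayed output (teacher-forced)
        net = w @ u
        y = np.tanh(net)
        e = d[k] - y
        g = (1.0 - y * y) * u           # gradient at the neuron: phi'(net) * u
        # NLMS-style step: learning rate divided by squared gradient norm
        w += eta * e * g / (eps + g @ g)
        errors.append(abs(e))
        d_prev = d[k]
    return w, np.array(errors)

# Synthetic teacher with the same structure (hypothetical weights 0.5, -0.3)
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 2000)
d = np.empty_like(x)
prev = 0.0
for k in range(len(x)):
    d[k] = np.tanh(0.5 * x[k] - 0.3 * prev)
    prev = d[k]

w, err = normalized_recurrent_perceptron(x, d)
```

Because the step is divided by the squared gradient norm, the effective learning rate shrinks automatically where the gradient is large, much as NLMS normalizes by the input power in the linear case.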


Similar Articles

Normalized Gradient with Adaptive Stepsize Method for Deep Neural Network Training

In this paper, we propose a generic and simple algorithmic framework for first-order optimization. The framework essentially contains two consecutive steps in each iteration: 1) computing and normalizing the mini-batch stochastic gradient; 2) selecting an adaptive step size to update the decision variable (parameter) towards the negative of the normalized gradient. We show that the proposed approa...
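The two-step loop described in this teaser can be sketched as follows; the quadratic toy objective and the 1/√t step-size schedule below are illustrative assumptions, not the paper's actual choices.

```python
import numpy as np

def normalized_gradient_descent(grad_fn, w0, steps=200, eta0=1.0, eps=1e-12):
    """First-order loop: normalize the gradient, then move with an
    adaptive step size (here a simple 1/sqrt(t) schedule)."""
    w = np.asarray(w0, dtype=float).copy()
    for t in range(1, steps + 1):
        g = grad_fn(w)
        g_hat = g / (np.linalg.norm(g) + eps)  # step 1: normalize the gradient
        step = eta0 / np.sqrt(t)               # step 2: adaptive step size
        w = w - step * g_hat
    return w

# Hypothetical toy loss f(w) = ||w - w_star||^2 / 2, gradient w - w_star
w_star = np.array([3.0, -1.0])
w = normalized_gradient_descent(lambda w: w - w_star, np.zeros(2))
```

Normalizing first decouples the direction from the gradient magnitude, so the step-size schedule alone controls how far each update moves.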


A Nonlinear Neural FIR Filter With An Adaptive Activation Function

An adaptive amplitude normalized nonlinear gradient descent (AANNGD) algorithm for the class of nonlinear finite impulse response (FIR) adaptive filters (the dynamical perceptron) is introduced. This is achieved by making the amplitude of the nonlinear activation function gradient adaptive. The proposed learning algorithm is suitable for processing nonlinear and nonstationary signals with a larg...
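A rough sketch of the idea, under stated assumptions rather than the paper's exact algorithm: a short nonlinear FIR filter y = a·tanh(wᵀu) whose activation amplitude a receives its own gradient update alongside a normalized weight update. The filter length, step sizes, and teacher signal are all hypothetical.

```python
import numpy as np

def adaptive_amplitude_sketch(x, d, taps=3, eta_w=0.3, eta_a=0.05, eps=1e-6):
    """Nonlinear FIR filter y = a * tanh(w . u) with a gradient-adapted
    activation amplitude a (illustrative sketch)."""
    w = np.zeros(taps)
    a = 1.0                                  # adaptive activation amplitude
    errs = []
    for k in range(taps, len(x)):
        u = x[k - taps:k][::-1]              # tapped delay line [x(k-1), ..., x(k-taps)]
        phi = np.tanh(w @ u)
        y = a * phi
        e = d[k] - y
        g = a * (1.0 - phi * phi) * u        # gradient of y w.r.t. the weights
        w += eta_w * e * g / (eps + g @ g)   # normalized weight update
        a += eta_a * e * phi                 # amplitude update: dy/da = phi
        errs.append(abs(e))
    return w, a, np.array(errs)

# Hypothetical teacher in the same model family (amplitude 0.8)
rng = np.random.default_rng(1)
x = rng.uniform(-1.0, 1.0, 3000)
d = np.zeros_like(x)
for k in range(2, len(x)):
    d[k] = 0.8 * np.tanh(0.6 * x[k - 1] - 0.2 * x[k - 2])

w, a, errs = adaptive_amplitude_sketch(x, d)
```

Letting the amplitude adapt tracks changes in signal dynamic range that the weights alone, squeezed through the fixed-range tanh, cannot absorb.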


Dynamical multilayer neural networks that learn continuous trajectories

The feed-forward multilayer networks (perceptrons, radial basis function (RBF) networks, probabilistic networks, etc.) are currently used as "static systems" in pattern recognition, speech generation, identification and control, prediction, etc. (see, e.g., [1]). Theoretical works by several researchers, including [2] and [3], have proved that, even with one hidden layer, a perceptron neural net...


Evolving Recurrent Bilinear Perceptrons for Time Series Prediction

The development of efficient algorithms, based on optimization techniques, for learning the weights and architecture of recurrent neural networks (RNNs) has recently received much attention. Many of the proposed training methods suffer from certain fundamental drawbacks. Firstly, it is difficult to devise efficient learning algorithms that guarantee stability of the overall system. Secondly, t...


A Family of Variable Step-Size Normalized Subband Adaptive Filter Algorithms Using Statistics of System Impulse Response

This paper presents a new variable step-size normalized subband adaptive filter (VSS-NSAF) algorithm. The proposed algorithm uses prior knowledge of the system impulse response statistics, and the optimal step-size vector is obtained by minimizing the mean-square deviation (MSD). In comparison with NSAF, the VSS-NSAF algorithm has a faster convergence speed and lower MSD. To reduce the computa...



Journal:

Volume   Issue

Pages  -

Publication date: 2000